A New Approximate Maximal Margin Classification Algorithm
Abstract
A new incremental learning algorithm is described which approximates the maximal margin hyperplane w.r.t. norm p ≥ 2 for a set of linearly separable data. Our algorithm, called ALMA_p (Approximate Large Margin algorithm w.r.t. norm p), takes O((p − 1)/(α²γ²)) corrections to separate the data with p-norm margin larger than (1 − α)γ, where γ is the (normalized) p-norm margin of the data. ALMA_p avoids quadratic (or higher-order) programming methods. It is very easy to implement and is as fast as on-line algorithms, such as Rosenblatt's Perceptron algorithm. We performed extensive experiments on both real-world and artificial datasets. We compared ALMA_2 (i.e., ALMA_p with p = 2) to standard Support Vector Machines (SVM) and to two incremental algorithms: the Perceptron algorithm and Li and Long's ROMMA. The accuracy levels achieved by ALMA_2 are superior to those achieved by the Perceptron algorithm and ROMMA, but slightly inferior to SVM's. On the other hand, ALMA_2 is considerably faster and easier to implement than standard SVM training algorithms. When learning sparse target vectors, ALMA_p with p > 2 largely outperforms Perceptron-like algorithms, such as ALMA_2.
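To make the update scheme concrete, the following is a minimal sketch of an ALMA_2-style training loop (the p = 2 case), assuming instances normalized to unit Euclidean norm and labels in {−1, +1}. The function name alma2_sketch, the constants B and C, and the 1/√k schedules for the margin threshold and learning rate are illustrative assumptions in the spirit of the abstract, not the paper's tuned parameter choices.

```python
import numpy as np

def alma2_sketch(X, y, alpha=0.5, B=None, C=np.sqrt(2.0), epochs=10):
    """Illustrative ALMA_2-style training loop (the p = 2 case).

    Assumes each row of X has unit Euclidean norm and y[i] is in {-1, +1}.
    The constants B, C and the 1/sqrt(k) schedules are assumptions made for
    illustration; see the paper for the actual parameter choices and analysis.
    """
    if B is None:
        B = 1.0 / alpha                      # assumed coupling of the threshold scale to alpha
    n, d = X.shape
    w = np.zeros(d)                          # current weight vector, kept inside the unit ball
    k = 1                                    # index of the next correction
    for _ in range(epochs):
        for i in range(n):
            gamma_k = B / np.sqrt(k)         # margin threshold, shrinking over corrections
            if y[i] * np.dot(w, X[i]) <= (1.0 - alpha) * gamma_k:
                eta_k = C / np.sqrt(k)       # decreasing learning rate
                w = w + eta_k * y[i] * X[i]  # perceptron-like additive correction
                w /= max(1.0, np.linalg.norm(w))  # project back onto the unit L2 ball
                k += 1
    return w
```

The projection onto the unit ball after each correction keeps the shrinking margin threshold comparable across corrections; the general p > 2 case normalizes in the dual q-norm, which this p = 2 sketch does not capture.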
Similar resources
The Role of Weight Shrinking in Large Margin Perceptron Learning
We introduce into the classical perceptron algorithm with margin a mechanism that shrinks the current weight vector as a first step of the update. If the shrinking factor is constant the resulting algorithm may be regarded as a margin-error-driven version of NORMA with constant learning rate. In this case we show that the allowed strength of shrinking depends on the value of the maximum margin....
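As a rough illustration of the mechanism described in this entry, the sketch below shows a margin perceptron whose update begins by shrinking the current weight vector by a constant factor. The names, the constant factor lam, and the fixed margin threshold are assumptions for illustration, not the paper's exact formulation.

```python
import numpy as np

def shrinking_margin_perceptron(X, y, margin=0.1, eta=1.0, lam=0.01, epochs=50):
    """Margin perceptron with a multiplicative weight-shrinking step.

    On every margin error the current weight vector is first shrunk by a
    constant factor (1 - lam) and then updated in the usual additive way.
    The constant shrinking factor is an illustrative assumption; the paper
    studies how strong the shrinking may be relative to the maximum margin.
    """
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(epochs):
        for i in range(n):
            if y[i] * np.dot(w, X[i]) <= margin:   # margin error
                w = (1.0 - lam) * w                # shrinking step
                w = w + eta * y[i] * X[i]          # classical perceptron update
    return w
```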
Perceptron-like large margin classifiers
We consider perceptron-like algorithms with margin in which the standard classification condition is modified to require a specific value of the margin in the augmented space. The new algorithms are shown to converge in a finite number of steps and are used to approximately locate the optimal weight vector in the augmented space following a procedure analogous to Bolzano’s bisection method. We demo...
Exact and approximate solutions of fuzzy LR linear systems: New algorithms using a least squares model and the ABS approach
We present a methodology for characterization and an approach for computing the solutions of fuzzy linear systems with LR fuzzy variables. As solutions, notions of exact and approximate solutions are considered. We transform the fuzzy linear system into a corresponding linear crisp system and a constrained least squares problem. If the corresponding crisp system is incompatible, then the fuzzy ...
Maximal Margin Classification for Metric Spaces
In this article we construct a maximal margin classification algorithm for arbitrary metric spaces. We first show that the Support Vector Machine (SVM) is a maximal margin algorithm for the class of metric spaces where the negative squared distance is conditionally positive definite (CPD). This means that the metric space can be isometrically embedded into a Hilbert space, where one performs...
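A small, hedged sketch of the embedding view mentioned in this entry: when the negative squared distance of a metric is CPD (as it is for the Euclidean metric), a positive semidefinite Gram matrix can be built from pairwise distances by centering at a reference point and handed to a standard SVM with a precomputed kernel. The helper metric_to_kernel and the toy data are illustrative and not taken from the paper.

```python
import numpy as np
from sklearn.svm import SVC

def metric_to_kernel(D, ref=0):
    # D is an (n, n) matrix of pairwise distances. If -D**2 is conditionally
    # positive definite, the centered matrix below is positive semidefinite and
    # realizes an isometric embedding into a Hilbert space, with the point
    # `ref` mapped to the origin.
    D2 = D ** 2
    return 0.5 * (D2[:, [ref]] + D2[[ref], :] - D2)

# Toy usage with the Euclidean metric, for which -d^2 is known to be CPD.
rng = np.random.default_rng(0)
X = rng.normal(size=(40, 3))
y = np.where(X[:, 0] + 0.1 * rng.normal(size=40) > 0, 1, -1)
D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)  # pairwise distances
K = metric_to_kernel(D)
clf = SVC(kernel="precomputed", C=1.0).fit(K, y)
print("training accuracy:", clf.score(K, y))
```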
Journal: Journal of Machine Learning Research
Volume: 2
Issue: -
Pages: -
Year of publication: 2000